Probability and Possibility Automaton Learning Network
Authors
Abstract
Related articles
On Possibility/probability Transformations
The problem of converting possibility measures into probability measures has received attention in the past, though from relatively few scholars. The question is philosophically interesting as part of the debate between probability and fuzzy sets. The embedding of fuzzy sets into random set theory, carried out by Goodman and Nguyen (1985) and Wang Peizhuang (1983), among others, has solved this question in pr...
Variable Probability-Possibility Transformation
Our research team works on diagnosis by pattern recognition using Fuzzy Pattern Matching (FPM) as a classification method, for data coming from industrial and medical sectors. FPM uses a transformation from probability to possibility in order to construct possibility densities. These densities are used to assign each new sample to its suitable class, which corresponds to a functioning mo...
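The probability-to-possibility step that FPM relies on is commonly realized with the Dubois-Prade transformation, which the abstract does not spell out. A minimal sketch, assuming a discrete distribution; the function name is illustrative, and strictly tied probabilities would need equal possibilities, which this simple version breaks arbitrarily:

```python
def prob_to_poss(p):
    """Dubois-Prade transformation of a discrete probability
    distribution p (summing to 1) into a possibility distribution:
    visit outcomes in decreasing order of probability and give each
    one the tail sum of all probabilities not yet consumed."""
    order = sorted(range(len(p)), key=lambda i: p[i], reverse=True)
    poss = [0.0] * len(p)
    remaining = sum(p)  # 1.0 for a proper distribution
    for i in order:
        poss[i] = remaining
        remaining -= p[i]
    return poss

# The most probable outcome always receives possibility 1:
# prob_to_poss([0.5, 0.3, 0.2]) -> [1.0, 0.5, 0.2] (up to rounding)
```

This is the most specific transformation consistent with the probability ordering; less probable outcomes can never be more possible than more probable ones.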
The Neural Network Pushdown Automaton: Model, Stack and Learning Simulations
For neural networks to learn complex languages or grammars, they must have sufficient computational power or resources to recognize or generate such languages. Though many approaches have been discussed, one obvious way to enhance the processing power of a recurrent neural network is to couple it with an external stack memory, in effect creating a neural network pushdown automata...
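The external stack such a coupling requires is usually made trainable by pushing and popping fractional amounts of weight rather than whole symbols. A minimal continuous-stack sketch in the spirit of this line of work; the class and method names are illustrative assumptions, not the paper's API:

```python
class NeuralStack:
    """Continuous stack: each element carries a fractional strength,
    so push/pop actions can be driven by real-valued network outputs."""

    def __init__(self):
        self.values = []   # stored symbols, bottom to top
        self.weights = []  # fractional strengths in (0, 1]

    def push(self, value, strength):
        # place `value` on top with the given fractional strength
        self.values.append(value)
        self.weights.append(strength)

    def pop(self, strength):
        # remove `strength` total weight from the top downward,
        # thinning or deleting elements as their weight is consumed
        remaining = strength
        while remaining > 0 and self.weights:
            top = self.weights[-1]
            if top <= remaining:
                remaining -= top
                self.values.pop()
                self.weights.pop()
            else:
                self.weights[-1] = top - remaining
                remaining = 0.0

    def top(self):
        # topmost symbol, or None when the stack is empty
        return self.values[-1] if self.values else None
```

A controller network would emit push/pop strengths each step; with strengths saturated at 0 or 1 this degrades gracefully to an ordinary discrete stack.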
Probability-Possibility Transformations: A Brief Revisit
In this article, an annotated survey of approaches to the transformation from probability to possibility, and conversely, is provided, and notable properties of the transformation are discussed. This article proposes an approach different from the one described by some authors in the fuzzy set theory literature. The alternative proposal, which advocates reconstructions, is presente...
Learning Deterministic Finite Automaton with a Recurrent Neural Network
We consider the problem of learning a finite automaton with recurrent neural networks from positive evidence. We train Elman recurrent neural networks with a set of sentences in a language and extract a finite automaton by clustering the states of the trained network. We observe that the generalizations beyond the training set, in the language recognized by the extracted automaton, are due to ...
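The extraction step described above can be sketched by quantizing recorded hidden-state vectors into discrete states and reading transitions off the traces. Equal-width binning stands in here for the paper's clustering; the trace format and names are assumptions:

```python
def extract_automaton(traces, n_bins=2):
    """Build a finite-automaton transition table from recorded
    (hidden_state, input_symbol, next_hidden_state) triples of a
    trained recurrent network, assuming activations lie in [0, 1)."""

    def quantize(vec):
        # map each hidden unit into one of n_bins equal-width cells;
        # the resulting tuple names a discrete automaton state
        return tuple(min(int(v * n_bins), n_bins - 1) for v in vec)

    transitions = {}
    for h, sym, h_next in traces:
        transitions[(quantize(h), sym)] = quantize(h_next)
    return transitions
```

A run over a new sentence then just follows the table symbol by symbol; states that never disagree on any symbol can be merged afterwards to shrink the automaton.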
Journal
Journal title: IEEJ Transactions on Industry Applications
Year: 1998
ISSN: 0913-6339,1348-8163
DOI: 10.1541/ieejias.118.291